Wyner's Common Information under Rényi Divergence Measures
Authors
Abstract
We study a generalized version of Wyner's common information problem, also coined the distributed source simulation problem. The original common information problem consists of determining the minimum rate of the common input to two independent processors needed to generate an approximation of a joint distribution, when the distance measure used to quantify the discrepancy between the synthesized and target distributions is the normalized relative entropy. Our generalization replaces this distance measure with the unnormalized and normalized Rényi divergences of order α = 1 + s ∈ [0, 2]. We show that the minimum rate needed to ensure that the Rényi divergence between the distribution induced by a code and the target distribution vanishes remains the same as in Wyner's setting, except when the order α = 1 + s = 0. This implies that Wyner's common information is rather robust to the choice of distance measure employed. As a by-product of the proofs used to establish the above results, the exponential strong converse for the common information problem under the total variation distance measure is established.
Index Terms: Wyner's common information, distributed source simulation, Rényi divergence, total variation distance, exponential strong converse
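For orientation, the two central quantities can be stated as follows (standard definitions in our own notation, not quoted from the paper). The Rényi divergence of order α ∈ (0, 1) ∪ (1, ∞) between distributions P and Q on a finite alphabet is

\[ D_{\alpha}(P \| Q) = \frac{1}{\alpha - 1} \log \sum_{x} P(x)^{\alpha} Q(x)^{1-\alpha}, \]

which recovers the relative entropy D(P‖Q) as α → 1. Wyner's common information of a source pair (X, Y) ~ P_{XY} is

\[ C(X; Y) = \min_{P_{W \mid XY} \,:\, X - W - Y} I(X, Y; W), \]

the minimum mutual information over auxiliary variables W that render X and Y conditionally independent, and operationally the least rate of common randomness from which two independent processors can synthesize the pair.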
Similar Articles
On Reverse Pinsker Inequalities
New upper bounds on the relative entropy are derived as a function of the total variation distance. One bound refines an inequality by Verdú for general probability measures. A second bound improves the tightness of an inequality by Csiszár and Talata for arbitrary probability measures that are defined on a common finite set. The latter result is further extended, for probability measures on a ...
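For context on what a reverse Pinsker inequality inverts: the classical Pinsker inequality lower-bounds the relative entropy (in nats) by the total variation distance δ(P, Q) = (1/2)‖P − Q‖₁,

\[ D(P \| Q) \ge 2\, \delta(P, Q)^{2}. \]

No unconditional bound holds in the reverse direction, since D(P‖Q) can be infinite while δ(P, Q) is arbitrarily small; reverse Pinsker inequalities therefore involve additional quantities, such as the minimum probability mass of Q on a finite set, and the refinements of the Verdú and Csiszár–Talata bounds referred to above are of this type.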
On Rényi Divergence Measures for Continuous Alphabet Sources
The idea of ‘probabilistic distances’ (also called divergences), which in some sense assess how ‘close’ two probability distributions are from one another, has been widely employed in probability, statistics, information theory, and related fields. Of particular importance due to their generality and applicability are the Rényi divergence measures. While the closely related concept of Rényi ent...
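As a small numerical illustration (our own sketch in Python, not code from the cited paper; all names are ours), the discrete Rényi divergence is simple to compute, and its order-α value approaches the relative entropy as α → 1:

import numpy as np

def renyi_divergence(p, q, alpha):
    """Renyi divergence of order alpha (in nats) between finite pmfs p and q.

    Assumes alpha > 0, alpha != 1, and q(x) > 0 wherever p(x) > 0.
    """
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    mask = p > 0  # zero-mass points of p contribute nothing to the sum
    return np.log(np.sum(p[mask] ** alpha * q[mask] ** (1 - alpha))) / (alpha - 1)

def kl_divergence(p, q):
    """Relative entropy D(p || q) in nats."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    mask = p > 0
    return np.sum(p[mask] * np.log(p[mask] / q[mask]))

p, q = [0.5, 0.3, 0.2], [0.4, 0.4, 0.2]
for alpha in (0.5, 0.9, 0.999):
    print(f"D_{alpha}: {renyi_divergence(p, q, alpha):.6f}")
print(f"KL     : {kl_divergence(p, q):.6f}")  # D_alpha -> KL as alpha -> 1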
Rényi divergence measures for commonly used univariate continuous distributions
Probabilistic ‘distances’ (also called divergences), which in some sense assess how ‘close’ two probability distributions are from one another, have been widely employed in probability, statistics, information theory, and related fields. Of particular importance due to their generality and applicability are the Rényi divergence measures. This paper presents closed-form expressions for the Rényi...
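One representative closed form of the kind this paper catalogs is the Rényi divergence between two univariate Gaussians. The sketch below (our own Python code with an illustrative function name) implements the standard formula, which is finite only when the mixed variance ασ₂² + (1 − α)σ₁² is positive:

import math

def renyi_gaussians(mu1, s1, mu2, s2, alpha):
    """Renyi divergence of order alpha (nats) between N(mu1, s1^2) and N(mu2, s2^2).

    Standard closed form for alpha != 1; the divergence is infinite when the
    mixed variance alpha*s2^2 + (1 - alpha)*s1^2 is not positive.
    """
    var_alpha = alpha * s2**2 + (1 - alpha) * s1**2
    if var_alpha <= 0:
        return math.inf
    return (math.log(s2 / s1)
            + math.log(s2**2 / var_alpha) / (2 * (alpha - 1))
            + alpha * (mu1 - mu2)**2 / (2 * var_alpha))

# With equal variances the expression collapses to alpha*(mu1 - mu2)^2 / (2*sigma^2).
print(renyi_gaussians(0.0, 1.0, 1.0, 1.0, 0.5))  # 0.25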
Different quantum f-divergences and the reversibility of quantum operations
The concept of classical f-divergences gives a unified framework to construct and study measures of dissimilarity of probability distributions; special cases include the relative entropy and the Rényi divergences. Various quantum versions of this concept, and more narrowly, the concept of Rényi divergences, have been introduced in the literature with applications in quantum information theory;...
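Two of the quantum Rényi divergences alluded to here are the Petz version and the "sandwiched" version; for density operators ρ and σ they are (standard definitions, stated for orientation, not quoted from the paper)

\[ \bar{D}_{\alpha}(\rho \| \sigma) = \frac{1}{\alpha - 1} \log \operatorname{Tr}\!\left[ \rho^{\alpha} \sigma^{1-\alpha} \right], \qquad \widetilde{D}_{\alpha}(\rho \| \sigma) = \frac{1}{\alpha - 1} \log \operatorname{Tr}\!\left[ \left( \sigma^{\frac{1-\alpha}{2\alpha}} \rho\, \sigma^{\frac{1-\alpha}{2\alpha}} \right)^{\alpha} \right], \]

both of which reduce to the classical Rényi divergence when ρ and σ commute, and recover the Umegaki relative entropy Tr[ρ(log ρ − log σ)] as α → 1.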
Journal: CoRR
Volume: abs/1709.02168
Publication date: 2017